An Explicit Example of Leave-One-Out Cross-Validation Parameter Estimation for a Univariate Radial Basis Function
Abstract
We give an explicit example of the selection of the shape parameter for a certain univariate radial basis function (RBF) interpolation problem.
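To make the selection procedure concrete, the following minimal sketch (not the paper's worked example) chooses the shape parameter of a univariate Gaussian RBF interpolant by leave-one-out cross-validation, using Rippa's closed-form expression for the leave-one-out errors. The kernel, nodes, target function, and search grid are illustrative assumptions.

```python
import numpy as np

def loocv_cost(eps, x, f):
    """Rippa-style LOOCV cost for a univariate Gaussian RBF interpolant.

    With interpolation matrix A[i, j] = exp(-(eps * |x_i - x_j|)**2),
    the leave-one-out errors are e_k = c_k / (A^{-1})_{kk}, where c = A^{-1} f.
    """
    r = np.abs(x[:, None] - x[None, :])      # pairwise distances
    A = np.exp(-(eps * r) ** 2)              # Gaussian RBF interpolation matrix
    Ainv = np.linalg.inv(A)                  # acceptable for small demo problems
    c = Ainv @ f                             # interpolation coefficients
    e = c / np.diag(Ainv)                    # leave-one-out errors (Rippa's identity)
    return np.linalg.norm(e)

# Toy data: interpolate f(x) = sin(pi x) at a few unequally spaced nodes.
x = np.array([0.0, 0.15, 0.4, 0.55, 0.8, 1.0])
f = np.sin(np.pi * x)

# Grid search over candidate shape parameters.
eps_grid = np.linspace(0.5, 10.0, 200)
costs = [loocv_cost(e, x, f) for e in eps_grid]
eps_opt = eps_grid[int(np.argmin(costs))]
print(f"LOOCV-optimal shape parameter: {eps_opt:.3f}")
```

Rippa's identity avoids refitting the interpolant once per left-out node; a direct refit-and-predict loop yields the same leave-one-out errors up to rounding.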
Similar Papers
Determining optimal value of the shape parameter $c$ in RBF for unequal distances topographical points by Cross-Validation algorithm
Several radial basis function based methods contain a free shape parameter which has a crucial role in the accuracy of the methods. Performance evaluation of this parameter in different functions with various data has always been a topic of study. In the present paper, we consider studying the methods which determine an optimal value for the shape parameter in interpolations of radial basis ...
A RBF Neural Network Modeling Method based on Sensitivity Analysis and Pareto Law
Radial basis function neural network (RBFNN) has been widely used in nonlinear function approximation. In this paper, two limits of RBFNN have been handled which are network complexity and large-scale calculation respectively. Firstly, network complexity, which results from problems of numerous width parameters optimization, is solved by a method of space decomposition based on sensitivity anal...
A radial basis function network classifier to maximise leave-one-out mutual information
We develop an orthogonal forward selection (OFS) approach to construct radial basis function (RBF) network classifiers for two-class problems. Our approach integrates several concepts in probabilistic modelling, including cross validation, mutual information and Bayesian hyperparameter fitting. At each stage of the OFS procedure, one model term is selected by maximising the leave-one-out mutual...
Withdrawing an example from the training set: An analytic estimation of its effect on a non-linear parameterised model
For a non-linear parameterised model, the effects of withdrawing an example from the training set can be predicted. We focus on the prediction of the error on the left-out example, and of the confidence interval for the prediction of this example. We derive a rigorous expression of the first-order expansion, in parameter space, of the gradient of a quadratic cost function, and specify its valid...
Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation
In this paper a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...
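The last entry selects its regularization parameter by generalized cross validation (GCV). As a loose, self-contained illustration of that criterion only (not the paper's Golub-Kahan-based algorithm for large sparse problems), the sketch below picks the Tikhonov parameter of a small dense toy problem by minimizing the standard GCV function; the forward operator, noise level, and search grid are assumptions for the demo.

```python
import numpy as np

def gcv_cost(lam, G, d):
    """GCV function for standard-form Tikhonov regularization.

    m_lam = argmin ||G m - d||^2 + lam^2 ||m||^2, and
    GCV(lam) = ||G m_lam - d||^2 / trace(I - H(lam))^2,
    where H(lam) = G (G^T G + lam^2 I)^{-1} G^T is the influence matrix.
    """
    n = G.shape[1]
    M = np.linalg.solve(G.T @ G + lam**2 * np.eye(n), G.T)  # (G^T G + lam^2 I)^{-1} G^T
    H = G @ M                                               # influence (hat) matrix
    residual = G @ (M @ d) - d
    return (residual @ residual) / np.trace(np.eye(len(d)) - H) ** 2

# Toy ill-conditioned problem with noisy data.
rng = np.random.default_rng(0)
G = np.vander(np.linspace(0, 1, 40), 12, increasing=True)   # ill-conditioned forward operator
m_true = rng.standard_normal(12)
d = G @ m_true + 1e-3 * rng.standard_normal(40)

# Grid search over candidate regularization parameters.
lams = np.logspace(-8, 1, 60)
lam_opt = lams[int(np.argmin([gcv_cost(l, G, d) for l in lams]))]
print(f"GCV-selected regularization parameter: {lam_opt:.2e}")
```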